Pii: S0893-6080(00)00024-1

Authors

  • E. F. Gad
  • A. F. Atiya
  • S. Shaheen
  • A. El-Dessouki
Abstract

Piecewise-linear (PWL) neural networks are widely known for their amenability to digital implementation. This paper presents a new algorithm for learning in PWL networks consisting of a single hidden layer. The approach adopted is based upon constructing a continuous PWL error function and developing an efficient algorithm to minimize it. The algorithm consists of two basic stages in searching the weight space. The first stage of the optimization algorithm is used to locate a point in the weight space representing the intersection of N linearly independent hyperplanes, with N being the number of weights in the network. The second stage then uses this point as a starting point and continues the search by moving along the single-dimension boundaries between the different linear regions of the error function, hopping from one point (representing the intersection of N hyperplanes) to another. The proposed algorithm exhibits significantly accelerated convergence compared to standard algorithms such as back-propagation and its improved variants, such as the conjugate gradient algorithm. In addition, it has the distinct advantage that there are no parameters to adjust, and therefore no time-consuming parameter-tuning step. The new algorithm is expected to find applications in function approximation, time series prediction and binary classification problems. © 2000 Elsevier Science Ltd. All rights reserved.
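The abstract does not give code, but the objects it describes can be sketched. Below is a minimal, assumed illustration of a single-hidden-layer PWL network and a continuous PWL error function: the saturating-linear activation and the sum-of-absolute-errors loss are illustrative choices (not taken from the paper), picked because composing piecewise-linear maps and taking absolute differences keeps the error piecewise linear in the weights, which is the property the two-stage boundary-following search exploits.

```python
import numpy as np

def pwl_activation(z):
    """Saturating-linear unit (an assumed PWL activation): identity on
    [-1, 1], clipped outside. Piecewise-linear in its input."""
    return np.clip(z, -1.0, 1.0)

def forward(x, W1, b1, w2, b2):
    """Single-hidden-layer PWL network: affine map -> PWL hidden units
    -> affine (linear) output. The composition is PWL in the weights."""
    h = pwl_activation(W1 @ x + b1)
    return w2 @ h + b2

def pwl_error(X, y, W1, b1, w2, b2):
    """Sum of absolute errors over the training set: a continuous,
    piecewise-linear function of the weights, whose linear regions are
    separated by the hyperplanes the paper's search moves along."""
    return sum(abs(forward(x, W1, b1, w2, b2) - t) for x, t in zip(X, y))
```

With two hidden units whose contributions cancel (W1 rows of opposite sign, equal output weights), the network output and hence the absolute error at target 0 are exactly zero, which makes the PWL structure easy to check by hand.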


Similar articles

Pii: S0893-6080(00)00062-9

This article gives an overview of the different functional brain imaging methods, the kinds of questions these methods try to address and some of the questions associated with functional neuroimaging data for which neural modeling must be employed to provide reasonable answers. © 2000 Published by Elsevier Science Ltd.



Best approximation by Heaviside perceptron networks

In Lp-spaces with p ∈ [1, ∞) there exists a best approximation mapping to the set of functions computable by Heaviside perceptron networks with n hidden units; however, for p ∈ (1, ∞) such best approximation is not unique and cannot be continuous.


Multi-agent reinforcement learning: weighting and partitioning

This article addresses weighting and partitioning, in complex reinforcement learning tasks, with the aim of facilitating learning. The article presents some ideas regarding weighting of multiple agents and extends them into partitioning an input/state space into multiple regions with differential weighting in these regions, to exploit differential characteristics of regions and differential cha...


Assessing interactions among neuronal systems using functional neuroimaging

We show that new methods for measuring effective connectivity allow us to characterise the interactions between brain regions that underlie the complex interactions among different processing stages of functional architectures.


Pii: S0893-6080(00)00043-5

It is demonstrated that rotational invariance and reflection symmetry of image classifiers lead to a reduction in the number of free parameters in the classifier. When used in adaptive detectors, e.g. neural networks, this may be used to decrease the number of training samples necessary to learn a given classification task, or to improve generalization of the neural network. Notably, the symmet...



Journal:

Volume   Issue 

Pages  -

Publication year: 2000